Iterative Decoding of Binary Block and Convolutional Codes - IEEE Transactions on Information Theory

Authors

  • Joachim Hagenauer
  • Lutz Papke
Abstract

Iterative decoding of two-dimensional systematic convolutional codes has been termed "turbo" (de)coding. Using log-likelihood algebra, we show that any decoder can be used which accepts soft inputs, including a priori values, and delivers soft outputs that can be split into three terms: the soft channel and a priori inputs, and the extrinsic value. The extrinsic value is used as an a priori value for the next iteration. Decoding algorithms in the log-likelihood domain are given not only for convolutional codes but also for any linear binary systematic block code. The iteration is controlled by a stop criterion derived from cross entropy, which results in a minimal number of iterations. Optimal and suboptimal decoders with reduced complexity are presented. Simulation results show that very simple component codes are sufficient: block codes are appropriate for high rates and convolutional codes for lower rates, less than 2/3. Any combination of block and convolutional component codes is possible. Several interleaving techniques are described. At a bit error rate (BER) of 10^-4 the performance is slightly above or around the bounds given by the cutoff rate for reasonably simple block/convolutional component codes, interleaver sizes less than 1000, and three to six iterations.
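To make the bookkeeping of soft values concrete, the following is a minimal sketch (a toy construction of mine, not the paper's simulation setup): single-parity-check component codes arranged as a 3x3 product code, BPSK over an AWGN channel, and the exact tanh-rule box-plus of log-likelihood algebra. Each component decoder's soft output splits into the soft channel value, the a priori value, and the extrinsic value, L(u^) = L_c*y + L(u) + L_e(u^), and the extrinsic value of one decoder is fed to the other as a priori information; a fixed iteration count stands in for the paper's cross-entropy stop criterion.

    import numpy as np

    def boxplus(a, b):
        # Log-likelihood algebra: L-value of the XOR of two binary variables (exact tanh rule).
        return 2.0 * np.arctanh(np.clip(np.tanh(a / 2.0) * np.tanh(b / 2.0),
                                        -0.999999, 0.999999))

    def spc_extrinsic(L_in):
        # Soft-in/soft-out decoding of one single-parity-check codeword:
        # the extrinsic value of bit k is the box-plus of all other input L-values.
        n = len(L_in)
        L_e = np.empty(n)
        for k in range(n):
            acc = None
            for j in range(n):
                if j != k:
                    acc = L_in[j] if acc is None else boxplus(acc, L_in[j])
            L_e[k] = acc
        return L_e

    # Toy 3x3 product of (3,2) single-parity-check codes, BPSK over AWGN (noise level assumed).
    rng = np.random.default_rng(0)
    info = rng.integers(0, 2, (2, 2))
    code = np.zeros((3, 3), dtype=int)
    code[:2, :2] = info
    code[:2, 2] = code[:2, :2].sum(axis=1) % 2      # row parity bits
    code[2, :] = code[:2, :].sum(axis=0) % 2        # column parity bits
    x = 1 - 2 * code                                # BPSK mapping: bit 0 -> +1, bit 1 -> -1
    sigma2 = 0.5
    y = x + rng.normal(0.0, np.sqrt(sigma2), x.shape)
    Lc_y = 2.0 * y / sigma2                         # soft channel values L_c * y

    L_a = np.zeros_like(Lc_y)                       # a priori values, zero before iterating
    for it in range(4):                             # fixed count; the paper stops via cross entropy
        # Row decoder: its extrinsic output becomes a priori input for the column decoder.
        L_e_rows = np.vstack([spc_extrinsic(Lc_y[i] + L_a[i]) for i in range(3)])
        L_a = L_e_rows
        # Column decoder: its extrinsic output becomes a priori input for the next round of rows.
        L_e_cols = np.vstack([spc_extrinsic((Lc_y + L_a)[:, j]) for j in range(3)]).T
        L_a = L_e_cols

    # Soft output = soft channel value + a priori value + extrinsic value; the sign is the hard decision.
    L_out = Lc_y + L_e_rows + L_e_cols
    decoded = (L_out < 0).astype(int)
    print("bit errors after iterating:", int((decoded != code).sum()))

With stronger component codes the extrinsic values would come from an optimal symbol-by-symbol (MAP) decoder or one of the reduced-complexity approximations, rather than the closed-form single-parity-check rule used here.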


Similar resources

On the Intractability of Permuting a Block Code to Minimize Trellis Complexity [Correspondence] - IEEE Transactions on Information Theory

A novel trellis design technique for both block and convolutional codes based on the Shannon product of component block codes is introduced. Using the proposed technique, structured trellises for block and convolutional codes have been designed. It is shown that the designed trellises are minimal and allow reduced-complexity Viterbi decoding. Index Terms: Linear codes, trellis structure, product...
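As a loosely related, hedged illustration (a sketch of mine about minimal-trellis state complexity, not the Shannon-product construction the abstract refers to), the snippet below brings a binary generator matrix into trellis-oriented (minimum-span) form and reads off the state-space dimension profile, the quantity that coordinate permutations of a block code can change.

    import numpy as np

    def span(row):
        # First and last nonzero position of a generator row.
        nz = np.flatnonzero(row)
        return nz[0], nz[-1]

    def trellis_oriented(G):
        # Greedy reduction to a minimum-span (trellis-oriented) generator matrix over GF(2).
        G = G.copy() % 2
        k = G.shape[0]
        changed = True
        while changed:
            changed = False
            for i in range(k):
                for j in range(k):
                    if i == j:
                        continue
                    si, ei = span(G[i])
                    sj, ej = span(G[j])
                    if si == sj:
                        # Same start: add into the row with the larger end, so its span shrinks.
                        tgt, src = (i, j) if ei >= ej else (j, i)
                        G[tgt] = (G[tgt] + G[src]) % 2
                        changed = True
                    elif ei == ej:
                        # Same end: add into the row with the smaller start, so its span shrinks.
                        tgt, src = (i, j) if si <= sj else (j, i)
                        G[tgt] = (G[tgt] + G[src]) % 2
                        changed = True
        return G

    def state_profile(G):
        # log2 of the number of trellis states at each depth 0..n:
        # count the generators whose span straddles that depth.
        G = trellis_oriented(G)
        n = G.shape[1]
        spans = [span(row) for row in G]
        return [sum(1 for s, e in spans if s < t <= e) for t in range(n + 1)]

    # Example: a (7,4) Hamming generator matrix in two coordinate orders (chosen for illustration).
    G = np.array([[1, 0, 0, 0, 1, 1, 0],
                  [0, 1, 0, 0, 1, 0, 1],
                  [0, 0, 1, 0, 0, 1, 1],
                  [0, 0, 0, 1, 1, 1, 1]])
    print("original order:", state_profile(G))
    perm = [0, 4, 1, 5, 2, 6, 3]              # an arbitrary column permutation
    print("permuted order:", state_profile(G[:, perm]))

The greedy span-shrinking loop terminates because every row operation strictly reduces the total span length of the generator matrix.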


Near-optimum decoding of product codes: block turbo codes

This paper describes an iterative decoding algorithm for any product code built using linear block codes. It is based on soft-input/soft-output decoders for decoding the component codes so that near-optimum performance is obtained at each iteration. This soft-input/soft-output decoder is a Chase decoder which delivers soft outputs instead of binary decisions. The soft output of the decoder is a...
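Below is a hedged sketch of the Chase-type soft-output idea described above, under assumptions of mine: a (7,4) Hamming component code, p = 2 least-reliable test positions, a squared-Euclidean metric, and a fixed fallback reliability beta when no candidate in the list disagrees in a given bit (the exact weighting and scaling factors used in the paper differ).

    import itertools
    import numpy as np

    # Systematic (7,4) Hamming code, G = [I | P] and H = [P^T | I], used here as a toy component code.
    P = np.array([[1, 1, 0],
                  [1, 0, 1],
                  [0, 1, 1],
                  [1, 1, 1]])
    G = np.hstack([np.eye(4, dtype=int), P])
    H = np.hstack([P.T, np.eye(3, dtype=int)])

    def hamming_hard_decode(bits):
        # Single-error syndrome decoding: flip the position whose H column equals the syndrome.
        s = H @ bits % 2
        if s.any():
            for pos in range(7):
                if np.array_equal(H[:, pos], s):
                    bits = bits.copy()
                    bits[pos] ^= 1
                    break
        return bits

    def chase_soft_decode(y, p=2, beta=4.0):
        # y: received BPSK samples (bit 0 -> +1, bit 1 -> -1). Returns (decision, soft outputs).
        hard = (y < 0).astype(int)
        least_reliable = np.argsort(np.abs(y))[:p]
        candidates = []
        for flips in itertools.product([0, 1], repeat=p):
            test = hard.copy()
            test[least_reliable] ^= np.array(flips)
            cw = hamming_hard_decode(test)
            metric = np.sum((y - (1 - 2 * cw)) ** 2)   # squared Euclidean distance to candidate
            candidates.append((metric, cw))
        best_metric, best_cw = min(candidates, key=lambda c: c[0])
        soft = np.empty(7)
        for k in range(7):
            rivals = [m for m, cw in candidates if cw[k] != best_cw[k]]
            # Reliability from the metric gap to the best disagreeing candidate,
            # or an assumed fallback value beta when the list contains no such candidate.
            mag = (min(rivals) - best_metric) / 4.0 if rivals else beta
            soft[k] = mag * (1 - 2 * best_cw[k])       # sign carries the binary decision
        return best_cw, soft

    # Usage: encode a random message, add Gaussian noise, decode.
    rng = np.random.default_rng(1)
    msg = rng.integers(0, 2, 4)
    cw = msg @ G % 2
    y = (1 - 2 * cw) + rng.normal(0.0, 0.7, 7)
    dec, soft = chase_soft_decode(y)
    print("decoded correctly:", bool(np.array_equal(dec, cw)), "soft outputs:", np.round(soft, 2))

In a block turbo code, the difference between these soft outputs and the decoder's inputs would serve as extrinsic information passed to the decoder of the other dimension of the product code.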


New multilevel codes over GF(q)

In this paper, we apply set partitioning to multi-dimensional signal spaces over GF(q), particularly GF^(q-1)(q) and GF^q(q), and show how to construct both multi-level block codes and multi-level trellis codes over GF(q). We present two classes of multi-level (n, k, d) block codes over GF(q) with block length n, number of information symbols k, and minimum distance d_min >= d, where n =...


On the error exponent for woven convolutional codes with inner warp

In this correspondence the error exponent and the decoding complexity of binary woven convolutional codes with outer warp and with binary convolutional codes as outer and inner codes are studied. It is shown that an error probability that is exponentially decreasing with the product of the outer and inner code memories can be achieved with a nonexponentially increasing decoding complexity.




Publication date: 2004